Bayes Risk Consistency of Large Margin Binary Classification Methods - A Survey

Author

  • Paramveer S. Dhillon
Abstract

In this paper we survey the Bayes-risk consistency of various large margin binary classifiers. Many classification algorithms minimize a tractable convex surrogate φ of the 0-1 loss function; for example, the SVM (Support Vector Machine) minimizes the hinge loss, and AdaBoost minimizes the exponential loss. Under suitable regularization conditions, it is possible to demonstrate the Bayes-risk consistency of methods based on minimizing a convex surrogate of the intractable 0-1 loss. We survey four papers that present such results in the binary classification setting: [BJM04], [Jia00], [Zha04], and [LV04].
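As a small illustration (not taken from the surveyed papers), the three losses mentioned above can be compared as functions of the margin y·f(x); both convex surrogates upper-bound the 0-1 loss at every margin value:

```python
import numpy as np

def zero_one_loss(margin):
    # 0-1 loss: 1 when the margin y*f(x) is non-positive (misclassified), else 0
    return (margin <= 0).astype(float)

def hinge_loss(margin):
    # Hinge loss (minimized by the SVM): max(0, 1 - y*f(x)), a convex upper bound on 0-1 loss
    return np.maximum(0.0, 1.0 - margin)

def exponential_loss(margin):
    # Exponential loss (minimized by AdaBoost): exp(-y*f(x)), also a convex upper bound
    return np.exp(-margin)

# Evaluate all three losses on a few example margins
margins = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(zero_one_loss(margins))
print(hinge_loss(margins))
print(exponential_loss(margins))
```

Note that both surrogates dominate the 0-1 loss pointwise, which is one ingredient (alongside the regularization conditions discussed in the surveyed papers) in relating surrogate risk minimization to Bayes-risk consistency.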


Similar Resources

Statistical Analysis of Some Multi-Category Large Margin Classification Methods

The purpose of this paper is to investigate statistical properties of risk-minimization-based multi-category classification methods. These methods can be considered as natural extensions of binary large margin classification. We establish conditions that guarantee the consistency of classifiers obtained in the risk minimization framework with respect to the classification error. Examples are pro...


An Infinity-sample Theory for Multi-category Large Margin Classification

The purpose of this paper is to investigate infinity-sample properties of risk-minimization-based multi-category classification methods. These methods can be considered as natural extensions of binary large margin classification. We establish conditions that guarantee the infinity-sample consistency of classifiers obtained in the risk minimization framework. Examples are provided for two specif...


Margin Adaptive Risk Bounds for Classification Trees

Margin-adaptive risk bounds for Classification and Regression Tree (CART, Breiman et al., 1984) classifiers are obtained in the binary supervised classification framework. These risk bounds are obtained conditionally on the construction of the maximal deep binary tree and make it possible to prove that the linear penalty used in the CART pruning algorithm is valid under a margin condition. It is also show...


Variable margin losses for classifier design

The problem of controlling the margin of a classifier is studied. A detailed analytical study is presented of how properties of the classification risk, such as its optimal link and minimum risk functions, are related to the shape of the loss and its margin-enforcing properties. It is shown that for a class of risks, denoted canonical risks, asymptotic Bayes consistency is compatible with simp...


Adaptive Sampling Under Low Noise Conditions

We survey some recent results on efficient margin-based algorithms for adaptive sampling in binary classification tasks. Using the so-called Mammen-Tsybakov low-noise condition to parametrize the distribution of covariates, and assuming linear label noise, we state bounds on the convergence rate of the adaptive sampler to the Bayes risk. These bounds show that, excluding logarithmic factors, th...




Publication date: 2008